
    Two-Dimensional Source Coding by Means of Subblock Enumeration

    A technique of lossless compression via substring enumeration (CSE) attains compression ratios comparable to those of popular lossless compressors for one-dimensional (1D) sources. CSE uses a probabilistic model built from the circular string of an input source to encode the source. CSE is applicable to two-dimensional (2D) sources, such as images, by treating a line of pixels of the 2D source as a symbol of an extended alphabet. At the initial step of the CSE encoding process, we need to output the number of occurrences of all symbols of the extended alphabet, so the time complexity increases exponentially as the size of the source grows. To reduce the time complexity, we propose a new CSE which can encode a 2D source block by block instead of line by line. The proposed CSE uses the flat torus of an input 2D source as a probabilistic model for encoding the source instead of the circular string of the source. Moreover, we analyze the limit of the average codeword length of the proposed CSE for general sources. Comment: 5 pages, Submitted to ISIT201
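
    The circular-string statistic at the heart of CSE can be illustrated with a short sketch. This is not the authors' encoder, only a minimal illustration (function name is ours) of counting how often each fixed-length substring occurs in the circular string of an input, the kind of statistics such a probabilistic model is built from:

```python
from collections import Counter

def circular_substring_counts(s: str, k: int) -> Counter:
    """Count occurrences of every length-k substring of s read circularly."""
    n = len(s)
    doubled = s + s[:k - 1]  # wrap around so windows starting near the end are covered
    return Counter(doubled[i:i + k] for i in range(n))

counts = circular_substring_counts("abab", 2)  # {"ab": 2, "ba": 2}
```

    Reading the input circularly means every position contributes exactly one length-k window, so the counts always sum to the input length.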

    Asymptotic Optimality of Antidictionary Codes

    An antidictionary code is a lossless compression algorithm using an antidictionary, which is the set of minimal words that do not occur as substrings in an input string. The code was proposed by Crochemore et al. in 2000, and its asymptotic optimality has been proved only with respect to a specific information source, called the balanced binary source, that is, a binary Markov source in which every state transition occurs with probability 1/2 or 1. In this paper, we prove the optimality of both static and dynamic antidictionary codes with respect to a stationary ergodic Markov source on a finite alphabet such that a state transition occurs with probability p (0 < p ≤ 1). Comment: 5 pages, to appear in the proceedings of 2010 IEEE International Symposium on Information Theory (ISIT2010)
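
    The notion of a minimal forbidden word can be made concrete with a small brute-force sketch (not the authors' construction; the names and enumeration strategy are ours): a word w = a·u·b is a minimal forbidden word of a string s if w never occurs in s while both its longest proper prefix a·u and longest proper suffix u·b do occur.

```python
from itertools import product

def substrings(s: str) -> set:
    """All non-empty substrings of s."""
    return {s[i:j] for i in range(len(s)) for j in range(i + 1, len(s) + 1)}

def antidictionary(s: str, alphabet: str, max_len: int) -> set:
    """Brute-force minimal forbidden words of s, up to length max_len."""
    occ = substrings(s)
    mfws = set()
    for k in range(1, max_len + 1):
        for w in map("".join, product(alphabet, repeat=k)):
            absent = w not in occ
            # minimality: the word itself never occurs, but dropping either
            # end symbol yields a substring that does occur
            # (trivially true for single symbols)
            minimal = len(w) == 1 or (w[:-1] in occ and w[1:] in occ)
            if absent and minimal:
                mfws.add(w)
    return mfws
```

    For s = "01110" the shortest minimal forbidden word is "00": it never occurs, yet "0" occurs both as its prefix and as its suffix.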

    A Universal Two-Dimensional Source Coding by Means of Subblock Enumeration

    The technique of lossless compression via substring enumeration (CSE) is a kind of enumerative code and uses a probabilistic model built from the circular string of an input source for encoding a one-dimensional (1D) source. CSE is applicable to two-dimensional (2D) sources, such as images, by treating a line of pixels of a 2D source as a symbol of an extended alphabet. At the initial step of the CSE encoding process, we need to output the number of occurrences of all symbols of the extended alphabet, so the time complexity increases exponentially as the size of the source grows. To reduce computational time, we can rearrange the pixels of a 2D source into a 1D string along a space-filling curve such as a Hilbert curve. However, information about adjacent cells in the 2D source may be lost in the conversion. To reduce the time complexity and compress a 2D source without converting it to a 1D source, we propose a new CSE which can encode a 2D source in a block-by-block fashion instead of a line-by-line fashion. The proposed algorithm uses the flat torus of an input 2D source as a probabilistic model instead of the circular string of the source. Moreover, we prove the asymptotic optimality of the proposed algorithm for 2D general sources.
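
    The space-filling-curve alternative mentioned in the abstract can be sketched quickly. Below is the standard iterative index-to-coordinate mapping for the Hilbert curve (a generic illustration, not code from the paper; function names are ours), used to read a 2^k × 2^k image into a 1D sequence:

```python
def hilbert_d2xy(order: int, d: int):
    """Map index d along a 2**order x 2**order Hilbert curve to (x, y)."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:          # rotate the quadrant so sub-curves join up
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def linearize(image):
    """Read a 2**k x 2**k image (list of rows) along the Hilbert curve."""
    n = len(image)
    order = n.bit_length() - 1
    return [image[y][x] for x, y in (hilbert_d2xy(order, d) for d in range(n * n))]
```

    Consecutive indices along the curve map to 4-adjacent pixels, which is precisely the locality that a naive row-by-row scan gives up.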

    Why does arterial blood pressure rise actively during REM sleep?

    A large fluctuation in autonomic function is one of the most important characteristics of REM sleep. Arterial blood pressure (AP) increases during the transition from non-REM to REM sleep, showing phasic surges during REM sleep. REM-associated AP changes involve 1) a long-term recovery process after surgery, 2) circadian rhythm, and 3) relationships with ambient temperature. REM-associated AP changes are mediated by sympathetic nerves, buffered by the baroreflex, abolished in decerebrated cats, and related to hippocampal theta activity in rats. Furthermore, the midbrain dopaminergic system has recently been found to be involved in REM-associated AP increases.

    A deazariboflavin chromophore kinetically stabilizes reduced FAD state in a bifunctional cryptochrome

    An animal-like cryptochrome derived from Chlamydomonas reinhardtii (CraCRY) is a bifunctional flavoenzyme harboring flavin adenine dinucleotide (FAD) as a photoreceptive/catalytic center; it functions both in the regulation of gene transcription and in the repair of UV-induced DNA lesions in a light-dependent manner, using different FAD redox states. To address how CraCRY stabilizes the physiologically relevant redox state of FAD, we investigated the thermodynamic and kinetic stability of the two-electron reduced anionic FAD state (FADH−) in CraCRY and related (6–4) photolyases. The thermodynamic stability of FADH− was almost the same for all tested proteins. However, the kinetic stability of FADH− varied remarkably depending on the local structure of the secondary pocket, where an auxiliary chromophore, 8-hydroxy-7,8-didemethyl-5-deazariboflavin (8-HDF), can be accommodated. The observed effect of 8-HDF uptake on the enhancement of the kinetic stability of FADH− suggests an essential role of 8-HDF in the bifunctionality of CraCRY.
    Hosokawa Y., Morita H., Nakamura M., et al. A deazariboflavin chromophore kinetically stabilizes reduced FAD state in a bifunctional cryptochrome. Scientific Reports 13, 16682 (2023); https://doi.org/10.1038/s41598-023-43930-0

    Compression by Substring Enumeration Using Sorted Contingency Tables

    This paper proposes two improved variants of Compression by Substring Enumeration (CSE) with a finite alphabet. In previous studies on CSE, the encoder uses inequalities that bound the number of occurrences of the substring or minimal forbidden word (MFW) to be encoded. The inequalities are derived from a contingency table of occurrence counts of substrings and MFWs. The codeword length of a substring or an MFW grows with the difference between the upper and lower bounds deduced from these inequalities; however, the lower bound is not tight. We therefore derive a new, tighter lower bound based on the contingency table and propose a new CSE algorithm using the new inequality. We also propose a new encoding order for substrings and MFWs based on a sorted contingency table, in which both the row and column marginal totals are sorted in descending order instead of the lexicographical order used in previous studies, and we propose the first CSE algorithm using this new encoding order. Experimental results show that the compression ratios of all files of the Calgary corpus under the proposed algorithms are better than those of a previous study on CSE with a finite alphabet. Moreover, the compression ratios under the second proposed CSE are better than or equal to those under a well-known compressor for 11 of the 14 files in the corpus.
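
    The contingency table at the center of this construction can be sketched for a binary source (an illustrative fragment, not the paper's encoder; names are ours): for a context u, the table cell (a, b) holds the circular occurrence count of a·u·b, and the marginal totals are then the counts of a·u (rows) and u·b (columns).

```python
from collections import Counter

def contingency_table(s: str, u: str):
    """2x2 table whose (a, b) cell counts circular occurrences of a + u + b."""
    n, m = len(s), len(u) + 2
    ext = s + s[:m - 1]  # wrap so every circular window of length m appears
    counts = Counter(ext[i:i + m] for i in range(n))
    return [[counts["0" + u + "0"], counts["0" + u + "1"]],
            [counts["1" + u + "0"], counts["1" + u + "1"]]]
```

    Row sums give the counts of 0·u and 1·u, and column sums those of u·0 and u·1; bounds on the cell values derived from such marginals are what determine the codeword lengths in CSE.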

    The N400 event-related potential in aphasia

    Although the N400 component of event-related potentials (ERPs) is suggested to reflect language processing, exactly which language processing functions the N400 is sensitive to is not clear. We investigated this component in aphasic patients with impairments of language processing. Meaningful and meaningless words in Kana (Japanese characters) were used as stimuli under a visual oddball paradigm. Increases in N400 latency and amplitude in the aphasic group were significant in comparison with the control group. In the aphasic group, N400 latency correlated significantly with the performance intelligence quotient as well as with language quotients. Moreover, the N400 effects were seen more clearly in the left hemisphere than in the right hemisphere for both groups. We propose that the abnormal variations in amplitude or latency of the N400 in the aphasic group reflect language processing functions (controlled processing and automatic processing) that differ between slight and severe cases of aphasia. Moreover, N400 effects are sensitive to intellectual abilities besides language ability. We also suggest that N400 effects in the left hemisphere for the aphasic group are a reflection of active language processing as a substitution function.

    EEG alpha power and laterality during dreaming in N-REM and REM sleep

    In the present study, electroencephalographic (EEG) alpha power, alpha frequency, and their individual laterality were studied. The dream data used in this study were collected during N-REM and REM sleep over 3 or 4 nights. In N-REM sleep, the alpha power spectrum tended to show higher values during dreaming than during non-dreaming. On the other hand, the decrement of alpha power in REM sleep was rather apparent during dreaming. The correlation between alpha power and mean alpha frequency was significantly positive in REM sleep, while a significantly negative correlation was seen in N-REM sleep. Finally, we discuss the relation between cortical activation and alpha activity during dreaming on the basis of the results obtained.